LSTM Encoder–Decoder for Dialogue Response Generation

Authors

  • Zhenlong Yu
  • Caixia Yuan
  • Xiaojie Wang
  • Guohua Yang
Abstract

This paper presents a dialogue response generator based on long short-term memory (LSTM) neural networks for the SLG (Spoken Language Generation) pilot task of DSTC5 [1]. We first encode an input containing a variable number of semantic units into a fixed-length semantic vector with an LSTM encoder. We then decode the semantic vector with a variant of LSTM and generate the corresponding text. In order to produce more flexible and context-aware responses, we incorporate the historical dialogue acts when generating the current utterance. Our experiments on DSTC5 data validate that the proposed LSTM-based generator significantly improves the quality of the generated responses compared to the baseline. Furthermore, it also yields comparable results to a state-of-the-art generator when evaluated on the same dataset.
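The encode-then-decode pipeline described above can be sketched in miniature. The following is an illustrative toy, not the paper's implementation: the weights are untrained, the output projection is a placeholder, and names such as `semantic_units` and `decode` are assumptions chosen for readability. It only shows the shape of the idea: fold a variable-length list of semantic-unit vectors into one fixed-length vector, then unroll a decoder from that vector.

```python
import math
import random

random.seed(0)


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


class LSTMCell:
    """Minimal pure-Python LSTM cell with one weight matrix per gate."""

    def __init__(self, input_size, hidden_size):
        self.h = hidden_size
        n = input_size + hidden_size  # cell sees [x; h] concatenated
        self.W = {g: [[random.uniform(-0.1, 0.1) for _ in range(n)]
                      for _ in range(hidden_size)] for g in "ifoc"}
        self.b = {g: [0.0] * hidden_size for g in "ifoc"}

    def step(self, x, h, c):
        z = x + h  # list concatenation: joint input for all gates

        def gate(g, act):
            return [act(sum(w * v for w, v in zip(row, z)) + bias)
                    for row, bias in zip(self.W[g], self.b[g])]

        i = gate("i", sigmoid)          # input gate
        f = gate("f", sigmoid)          # forget gate
        o = gate("o", sigmoid)          # output gate
        g = gate("c", math.tanh)        # candidate cell state
        c_new = [fv * cv + iv * gv for fv, cv, iv, gv in zip(f, c, i, g)]
        h_new = [ov * math.tanh(cv) for ov, cv in zip(o, c_new)]
        return h_new, c_new


def encode(cell, semantic_units):
    """Fold a variable-length sequence into a fixed-length semantic vector."""
    h, c = [0.0] * cell.h, [0.0] * cell.h
    for x in semantic_units:
        h, c = cell.step(x, h, c)
    return h  # fixed length regardless of len(semantic_units)


def decode(cell, semantic_vec, vocab_size, max_len=5):
    """Greedy decoding sketch: the semantic vector seeds the hidden state;
    the 'projection' to the vocabulary is a toy argmax placeholder."""
    h, c = semantic_vec, [0.0] * cell.h
    x = [0.0] * cell.h  # start-of-sequence placeholder input
    tokens = []
    for _ in range(max_len):
        h, c = cell.step(x, h, c)
        tokens.append(max(range(vocab_size), key=lambda j: h[j % cell.h]))
        x = h  # feed the hidden state back in as the next input
    return tokens


# Three one-hot "semantic units" (a hypothetical dialogue-act encoding).
units = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
enc = LSTMCell(input_size=4, hidden_size=4)
vec = encode(enc, units)             # fixed-length semantic vector
out = decode(enc, vec, vocab_size=4)  # token indices for the response
```

A real system would use trained parameters, separate encoder and decoder cells, word embeddings on the decoder input, and a learned softmax projection; the paper additionally conditions decoding on historical dialogue acts, which this sketch omits.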


Similar Papers

Affective Neural Response Generation

Existing neural conversational models process natural language primarily on a lexico-syntactic level, thereby ignoring one of the most crucial components of human-to-human dialogue: its affective content. We take a step in this direction by proposing three novel ways to incorporate affective/emotional aspects into long short-term memory (LSTM) encoder-decoder neural conversation models: (1) aff...


Natural Language Generation for Spoken Dialogue System using RNN Encoder-Decoder Networks

Natural language generation (NLG) is a critical component in a spoken dialogue system. This paper presents a Recurrent Neural Network based Encoder-Decoder architecture, in which an LSTM-based decoder is introduced to select, aggregate semantic elements produced by an attention mechanism over the input elements, and to produce the required utterances. The proposed generator can be jointly train...


Labeled Data Generation with Encoder-Decoder LSTM for Semantic Slot Filling

To train a model for semantic slot filling, manually labeled data in which each word is annotated with a semantic slot label is necessary, while manually preparing such data is costly. Starting from a small amount of manually labeled data, we propose a method to generate labeled data using the encoder-decoder LSTM. We first train the encoder-decoder LSTM that accepts and generates the sa...


Sequence-to-Sequence Prediction of Vehicle Trajectory via LSTM Encoder-Decoder Architecture

In this paper, we propose a deep learning-based vehicle trajectory prediction technique which can generate the future trajectory sequence of the surrounding vehicles in real time. We employ the encoder-decoder architecture, which analyzes the pattern underlying the past trajectory using the long short-term memory (LSTM)-based encoder and generates the future trajectory sequence using the LSTM...


Inferring and Executing Programs for Visual Reasoning Supplementary Material

In all experiments our program generator is an LSTM sequence-to-sequence model [9]. It comprises two learned recurrent neural networks: the encoder receives the natural-language question as a sequence of words and summarizes the question as a fixed-length vector; the decoder receives this fixed-length vector as input and produces the predicted program as a sequence of functions. The encoder and...





Publication date: 2016